System and method for non-contact biometric authentication.
Patent abstract:
The invention relates to a biometric authentication system for acquiring biometric characteristics of a body part of a person and for comparing the acquired biometric characteristics with biometric characteristics already registered for that body part in order to authenticate the person, comprising: a lighting unit (110) for illuminating the body part; an image capture unit (114) for capturing an image of the illuminated body part; a processing unit (118) for extracting the biometric characteristic data from the image acquired by the image capture unit (114); and an authentication unit (120) for comparing the extracted biometric data with registered characteristics in order to authenticate the person; wherein the lighting unit (110) is provided with at least one pattern mask (112) for projecting a mask pattern onto the body part such that the captured image comprises the projected mask pattern.

Publication number: CH713061A1
Application number: CH01404/16
Filing date: 2016-10-19
Publication date: 2018-04-30
Inventor: Bergqvist Johan
Applicant: Smart Secure Id In Sweden Ab
Main IPC class:
Patent description:
Description

Technical field of the invention

The present invention relates to the field of biometric authentication, and in particular to a system and method for contactless biometric authentication in which the authentication method also checks the liveness of the person to be authenticated.

General state of the art

Authentication methods have found widespread use in the field of biometrics. The need to authenticate people has existed for a long time. Events that trigger such a need include financial transactions, border crossings, voting, taking an examination, starting a business, accessing buildings, and so on. Biometric authentication methods use the specific and unique features of a particular body part of a person. Fortunately, humans consist of complex organ systems that exhibit a high degree of uniqueness. The most commonly used organic feature for authentication purposes is the fingerprint. Other methods use the iris of the human eye or blood vessel patterns under the skin of the hand or face; these methods are usually based on image acquisition and processing. Still other methods make use of the voice. One of the most widely used biometric authentications is based on the fingerprint, but it carries the risk of circumvention through the use of fake or artificial fingerprints. Another problem with fingerprint authentication is that the user must touch a surface, which can lead to hygiene issues if the same authentication device is used by several people, and dirt or grease deposited on the surface can cause the authentication device to malfunction. Other authentication methods, including contactless ones, use the handprint, palm veins or finger vein images. Examples are known from US 2008 0 107 309, EP 2 244 224, US 9 355 236, EP 1 612 718, US 9 223 955, US 6 813 010, US 7 359 531, EP 1 612 717 or EP 1 387 309.

[0005] There is a need for contactless biometric authentication in which the authentication method also checks the liveness of the person to be authenticated. Some systems use 3D cameras, multiple cameras, or even modulated or structured projector light sources for this purpose. Because such systems and devices are typically very complex, there is still a need to perform these operations in a more economical and robust manner with a minimal system design.

Summary of the Invention

It is an object of the invention to provide a system, an apparatus and a method for performing non-contact biometric authentication, including the possibility of checking liveness, in a cost-efficient and robust manner.

[0007] This is achieved by a biometric authentication system according to claim 1, a biometric authentication method according to claim 8 and a biometric authentication device according to claim 13. The biometric authentication system for acquiring biometric characteristics of a body part of a person and comparing the acquired biometric characteristics with biometric characteristics already registered for this body part in order to authenticate the person comprises: a lighting unit for illuminating the body part; an image capture unit for capturing an image of the illuminated body part; a processing unit for extracting the biometric characteristics from the image captured by the image capture unit; and an authentication unit for comparing the extracted biometric data with registered identification data in order to authenticate the person.
The lighting unit of the system is provided with at least one pattern mask in order to project a mask pattern onto the body part, so that the mask pattern becomes part of the captured image. The projected mask pattern, which preferably consists of identically shaped polygons of equal area, can be used to find the optimal distance between the body part and the lighting or capture unit in a completely contactless manner, by comparing the average size of the individual pattern shapes projected onto the body part with the size of the individual pattern shapes obtained during an earlier calibration step. The same mask pattern that is projected onto the body part, and is therefore part of the captured image, can also be used to define an area of interest of the body part and to extract the biometric characteristic data by determining, within the individual pattern shapes, the structures of interest of the body part (for example the palm print and the palm veins). In addition, the same mask pattern can be used to perform a liveness check based on the unevenness of the body part. Unevenness of the body part distorts the mask pattern. This distortion can hardly be reproduced with a fake body part and can therefore be used as an indication of the liveness of the person to be authenticated.

[0010] Further embodiments of the invention are set out in the dependent claims. In some embodiments, the pattern mask may have a regular pattern of polygons, preferably triangles, quadrilaterals, squares, pentagons or hexagons. This means that the polygons have identical shapes and areas. In some embodiments, the lighting unit can comprise at least one light source of visible light and/or at least one light source of near-infrared (NIR) light, the light source preferably comprising a plurality of light-emitting diodes (LEDs). The light sources or LEDs can be arranged in a circle around the capture unit. Different light sources can be arranged alternately. The lighting unit preferably comprises light sources for visible and for NIR light, for example to detect palm prints/folds and palm veins. In this way, the authentication analysis can be based on polygons that contain both palm print and palm vein features.

[0013] In some embodiments, the biometric authentication system may further comprise an optical guide unit, which is able to determine the optimal distance between the body part and the capture unit based on the mask pattern projected onto the body part. The optical guide unit can indicate the optimal distance, for example by a green light. If the distance to the lighting/capture unit is too small or too large, a red light can light up. In addition, when the optimal distance is approached, a gradual change from red to yellow to green can be provided. In this way, the user receives feedback on whether he is moving his hand in the right direction.

[0014] In some embodiments, the capture unit may comprise a microlens arrangement for obtaining depth information of the body part. The microlens arrangement is an optional feature, since the mask pattern projected onto the body part already provides certain depth information due to the distortion of the pattern.
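By way of illustration, the distance check described above can be sketched in a few lines. The snippet below is only a schematic example: it assumes that the areas of the individual projected polygons have already been measured in the captured image (for example from detected contours), and the tolerance value is an arbitrary assumption rather than a value taken from the description. Following the description, a larger projected pattern indicates a greater distance from the lighting unit.

```python
# Schematic sketch only: polygon areas are assumed to be measured already (e.g. in pixels^2)
# from the projected mask pattern in the captured image.
from statistics import mean

def distance_feedback(projected_areas, calibration_area, tolerance=0.10):
    """Compare the mean projected polygon area with the reference area stored during calibration.

    Per the description, the projected pattern appears larger when the body part is
    farther from the lighting unit, so the area ratio indicates the direction of the error.
    """
    ratio = mean(projected_areas) / calibration_area
    if abs(ratio - 1.0) <= tolerance:
        return "optimal"                       # approximately at the calibrated distance
    return "too far" if ratio > 1.0 else "too close"

# Example with made-up values (areas in pixels^2):
print(distance_feedback([105.0, 98.5, 101.2, 99.7], calibration_area=100.0))  # -> "optimal"
print(distance_feedback([140.0, 138.2, 141.5], calibration_area=100.0))       # -> "too far"
```

The same comparison underlies the optical guide unit described below, which translates the deviation into a colored indication.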
Furthermore, the invention relates to a method for acquiring biometric characteristic data of a body part of a person and for comparing the acquired biometric characteristic data with registered characteristic data already acquired from this body part in order to authenticate the person, comprising the steps of: illuminating the body part using a lighting unit that is provided with at least one pattern mask in order to project a mask pattern onto the body part; capturing an image of the illuminated body part, comprising the projected mask pattern, using an image capture unit; extracting the biometric characteristics from the image captured by the capture unit using a processing unit; and comparing the extracted biometric data with registered identification data in order to authenticate the person using an authentication unit.

In some embodiments, the pattern mask can have a regular pattern of polygons, preferably triangles, quadrilaterals, squares, pentagons or hexagons, and is used to determine the optimal distance between the body part and the capture unit.

[0017] In some embodiments, the mask pattern is used to guide the body part in order to reach an optimal distance between the body part and the capture unit. This can be done by comparing the average size/area of the polygons of the projected mask pattern with the size/area of the polygons obtained during a calibration step.

In some embodiments, the lighting unit can illuminate the body part with visible light and NIR light in order to obtain combined images that show both the structures of the body part that are recognizable under visible light (e.g. the palm print) and those recognizable under NIR light (e.g. the palm veins), as well as the mask pattern. The combined image is then used to extract the biometric data.

[0019] In some embodiments, the mask pattern can be used to extract the biometric characteristics. For example, the mask pattern divides the image into several small regions that can be analyzed for the presence of biometric data (e.g. palm print and/or palm veins).

The invention further relates to a biometric authentication device for capturing images in order to carry out the method described above. The device comprises a lighting unit for illuminating the body part and an image capture unit for capturing an image of the illuminated body part, wherein the lighting unit is provided with at least one pattern mask in order to project a mask pattern onto the body part, so that the captured image comprises the projected mask pattern.

[0021] In some embodiments, the device further comprises a processing unit or means for connecting the device to a processing unit. The processing unit extracts the biometric characteristic data from the image captured by the capture unit.

[0022] In some embodiments, the device further comprises an authentication unit or means for connecting the device to an authentication unit. The authentication unit compares the extracted biometric data with registered identification data in order to authenticate the person.

[0023] According to another embodiment of the invention, the authentication can be started or signaled by an up-and-down, forward/backward or lateral movement of the palm of the hand, or by any independent pivoting, pitching or rolling movement. The area of interest is then defined as the region in which palm folds and palm veins are present, which are then detected under visible and NIR light, as shown according to an embodiment of the invention.
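The idea of an area of interest that contains both palm folds (visible light) and palm veins (NIR light) can be pictured with a small sketch. It is an illustrative assumption rather than the method defined in the claims: it presumes that two aligned binary feature maps have already been extracted from the visible-light image and the NIR image, and it simply marks where both kinds of structure coincide.

```python
# Illustrative sketch only: assumes two aligned binary feature maps
# (True where a palm fold / palm vein was detected) are already available.
import numpy as np

def fold_and_vein_overlap(fold_map: np.ndarray, vein_map: np.ndarray) -> np.ndarray:
    """Return a boolean map of pixels where both palm folds and palm veins are present."""
    if fold_map.shape != vein_map.shape:
        raise ValueError("feature maps must be aligned and of equal size")
    return np.logical_and(fold_map, vein_map)

# Usage with tiny made-up maps (1 = feature present):
folds = np.array([[1, 0, 1],
                  [0, 1, 1],
                  [0, 0, 1]], dtype=bool)
veins = np.array([[0, 0, 1],
                  [0, 1, 0],
                  [1, 0, 1]], dtype=bool)
overlap = fold_and_vein_overlap(folds, veins)
print(overlap.astype(int))
```

In the detailed description below, such overlaps are evaluated per pattern cell of the projected mask rather than per pixel.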
According to another embodiment of the invention, a stereoscopic effect creates a 3D model of the palm when the hand is moved up and down or in any of the pivoting, pitching or rolling movements; a series of images with different LED patterns at different focal lengths is thereby created, from which a 3D image of the palm and veins is built. This can also be used for the liveness check.

[0025] According to another embodiment of the invention, an area of interest is determined by the following steps. The first step is to determine the wrist position of the hand to be authenticated. Once the wrist position has been determined, the regions are laid out radially from it. In another scenario, the next step after determining the wrist position is to determine the thumb position and to lay out the regions radially so as to connect the thumb to the other fingers and to connect all the joints to one another.

According to another embodiment of the invention, the palm starts in a first position in which the side of the palm is visible to the sensor, i.e. the back of the hand and the side of the palm are at right angles to the sensor. The palm is then rotated so that it is arranged parallel to the surface of the sensor. During this rotation, the outermost visible points define the area of interest while the palm flexes. This can also be used to carry out a liveness check.

According to another embodiment of the invention, for a liveness check the user is asked to close his palm into a fist, and the extraction of the area of interest is marked along the closing finger grooves on the top in order to draw a quadrilateral. This action is used as a liveness check whenever there is a risk of false palm authentication. When an attempt is made to hack the system using a fake/artificial palm, the user may be asked to perform this gesture as a liveness check.

According to another embodiment of the invention, the user is asked, for a liveness check, to open and close his fingers, and the extraction of the area of interest is marked along the moving finger grooves on the top in order to draw a polygon. This action is used as a liveness check whenever there is a risk of false palm authentication. When an attempt is made to hack the system using a fake/artificial palm, the user may be asked to perform this gesture as a liveness check.

[0029] The present invention provides a non-contact biometric authentication system. To ensure a more hygienic procedure, the present invention does not require the user to place his hand on a contact surface; the authentication procedure therefore guarantees that the user does not have to touch the device at any point during the authentication process. Furthermore, the invention discloses various methods for checking the liveness of the person authenticating himself. This helps to prevent fraudulent practices in hand or finger biometric authentication.

Brief Explanation of the Figures

The foregoing and other features of embodiments will become more apparent from the following detailed description of the embodiments in conjunction with the accompanying drawings. Elements in the figures are not necessarily drawn to scale; emphasis is instead placed on the clarity and understanding of these various elements and embodiments of the invention. The drawings therefore have a generalized form for the sake of clarity and simplicity, in which:

Fig. 1 illustrates a block diagram of a system for contactless biometric authentication.

Fig. 2 shows a device comprising a lighting unit and a capture unit.
Fig. 3a shows an image acquired using an NIR light source without a pattern mask.

Fig. 3b illustrates an image captured using an NIR light source with a pattern mask.

Fig. 4 shows a calibration system in which the optimal position of the hand is determined.

Fig. 5a illustrates the way in which a palm is positioned in front of the lighting unit, and also the way in which the light is scattered from the palm.

Fig. 5b shows a function of an optical guide unit.

Fig. 6 shows different positions into which the palm can be moved.

Fig. 7 illustrates a method of determining an area of interest of a palm.

Fig. 8 illustrates a method in which the features of a palm are extracted in accordance with an embodiment of the invention.

Fig. 9 shows various feature points in the area of the palm.

Fig. 10 illustrates a neural network of a plurality of neurons.

Fig. 11 shows a mask pattern made of hexagons (left) and a mask pattern distorted by an uneven body part (right).

Fig. 12 illustrates one possible way to define pattern blocks using the pattern mask.

Description of the Invention

In the following detailed description, reference is made to the accompanying drawings, which form a part hereof and in which certain embodiments that can be carried out are shown by way of illustration. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is understood that other embodiments may be used and that logical, mechanical and other changes may be made within the scope of the embodiments. The following detailed description is therefore not to be taken as limiting the scope of the invention; instead, the invention is defined by the appended claims.

The present invention avoids the disadvantages of the prior art by providing a contactless biometric authentication system which is used to authenticate the person to be authenticated, without using 3D cameras or a projector system, and also to check that person's liveness.

Fig. 1 illustrates a block diagram of a system for contactless biometric authentication in accordance with an embodiment of the invention. The system comprises at least one lighting unit 110 comprising at least one light source of visible light and/or at least one light source of near-infrared (NIR) light. The light source can comprise a plurality of light-emitting diodes (LEDs). The LEDs can be arranged in a circle. Visible-light LEDs and NIR-light LEDs can be arranged alternately.

[0034] The lighting unit 110, or its individual light sources, is masked with at least one pattern mask 112. The pattern is preferably defined as a regular group of polygons of equal area (see Fig. 11, left). The size of a polygon is preferably in the range of 1 mm, or it has an area of approximately 1 mm². Various types or shapes of polygons can be used, for example triangles, quadrilaterals, squares, pentagons or hexagons. Other shapes are also possible. Pattern masks on different LEDs can be arranged at different angles to each other. If, for example, a pattern mask is arranged on the two most distant LEDs of a circular arrangement, these can be used for calculation purposes, since the distances between them are triangulated in order to calculate the optimal distance of the body part from the lighting or capture unit or to determine the shape or surface area of the body part. The LED lights with different pattern masks are used to indicate the optimal distance at which the hand must be placed over the sensor of the registration unit.
This arrangement also helps to define the area of interest by clearly distinguishing the hand from the rest of the background. Furthermore, the system comprises an image capture unit 114 for acquiring the image of a body part, for example a hand, of the person to be authenticated. The image capture unit 114 has a camera and one or more lenses 116. The capture unit can have a microlens arrangement in order to capture depth information of the body part; the microlenses provide images with different focal lengths or of deformations of the surface, and thus 3D information. The system also includes a processing unit 118 for processing the images and extracting various biometric characteristics, such as palm folds/prints and/or palm veins, from the captured image. The system also has an authentication unit 120, which compares the biometric characteristic data extracted from the captured image with previously extracted and registered characteristic data in order to authenticate the person. The registered data can be stored in a storage unit of the authentication unit or in a separate storage unit. The described units can all be part of a single device. The processing unit, the authentication unit or the storage unit can in each case also be separate from the device and connected to one another or to the device via communication means.

Fig. 2 shows a device comprising a lighting unit 110 together with an image capture unit 114. In this case, the light sources are arranged in a circular ring. The LEDs with an NIR light source and those with a visible light source are arranged alternately.

[0037] In accordance with one embodiment, the invention further discloses a method for non-contact biometric authentication. Authentication is performed on images captured by an image capture unit 114. The image capture unit 114 can use a 2D camera (hereinafter referred to as the camera). The camera captures the masked image of the hand illuminated by the masked light source. If it has a microlens array, the camera can also acquire a depth image by building it up from the image created by the microlens arrangement (Fig. 11). The microlens arrangement is used to extract different focal lengths of the same image and to derive from them the depth image or surface deformation information of the body part.

Fig. 3a shows an image captured under an NIR light source without a pattern mask, while Fig. 3b shows the image captured with a pattern mask. When a pattern mask is used, the mask pattern appears as a geometric pattern that can be predetermined, for example, on a flat surface. A manual or automatic calibration test can be carried out for the camera, without the body part being present, in order to image the area of interest optimally, by arranging a flat surface at the desired distance from the lighting unit and the capture unit. After the manual or automatic calibration, a visual guide instrument or guide unit indicates the distance at which the hand should be placed, based on the predetermined geometric pattern size. The guide unit is explained in more detail below.

Fig. 4 shows a calibration system in which the optimal position of the palm is determined on the basis of the mask pattern projected onto the palm. Here, a contactless biometric authentication device 100, comprising at least the lighting unit 110 with a pattern mask 112 and the capture unit, is placed at a predetermined position with respect to a frame 511 of a calibration device 500.
The frame 511 holds a pre-mounted flat plate 512 at a known distance from the predetermined position of the authentication device. The image processing unit 118 detects the mask pattern on the flat plate 512 at this known distance and stores the pattern information (pattern area, pattern size, etc.) for the future recognition of the optimal distance of a palm for authentication. When a palm is later placed in front of the authentication device, the image processing unit 118 detects the mask pattern on the palm, extracts the pattern information and compares it with the pattern information previously stored during calibration.

After the calibration, the optical guide unit indicates the height at which the palm of the hand must be placed, based on the change in size of the mask pattern compared with the previously stored calibration mask pattern. If the palm is too close to (Fig. 5b, middle) or too far from (Fig. 5b, left) the capture unit, a red light appears, preferably projected onto the palm. When the palm is placed at the optimal distance (the actual mask pattern size corresponds to the calibration mask pattern size), a green light appears (Fig. 5b, right), preferably projected onto the palm. The user notices the color reflected in the gaps between the fingers, around the fingers or around the entire palm; or, if there is no gap between the fingers, the user notices the color of the light reflected from the palm back towards the sensor area, which he views in side or profile view. In addition, the colored light can change gradually from red through yellow to green, which indicates whether the palm is being moved in the right direction, towards or away from the capture unit. This makes it easy to find the correct position: if, for example, the color changes from yellow to red, the user realizes that he is moving the palm in the wrong direction; a simple sketch of such color feedback is given below.

Fig. 5 shows, on the left-hand side, the way in which a palm is positioned in front of the authentication device 100 and, on the right-hand side, the way in which the guide light 611 of the optical guide unit is scattered from the palm. The figure shows the guide light 613 scattered from the palm of the hand. The reflected light is seen by the user around the fingers. The user must change the position of the hand until a green light is reflected on his palm. The green light indicates that the distance of the palm from the authentication device is optimal. As soon as the optimal position has been found, the lighting unit 110 is activated for authentication. The calibration based on the mask pattern enables the authentication device to function without any aid being needed to place the hand in the optimal position. The device therefore works completely without contact.
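Purely as an illustrative sketch of the color feedback described above, the ratio between the measured mask-pattern size and the stored calibration size could be mapped to the guide colors as follows; the threshold values are assumptions and are not specified in the description.

```python
# Illustrative sketch: map the measured/calibrated pattern-size ratio to a guide color.
# The thresholds (10% for green, 30% for yellow) are assumed values, not from the patent.

def guide_color(measured_pattern_size: float, calibration_pattern_size: float) -> str:
    """Return 'green', 'yellow' or 'red' depending on how close the palm is to the calibrated distance."""
    deviation = abs(measured_pattern_size / calibration_pattern_size - 1.0)
    if deviation <= 0.10:
        return "green"   # pattern size matches the calibration: optimal distance
    if deviation <= 0.30:
        return "yellow"  # getting close: keep moving in the same direction
    return "red"         # too close or too far from the capture unit

# Usage: the color changes gradually from red to yellow to green as the palm
# approaches the calibrated distance, mirroring the optical guide unit above.
for size in (160.0, 125.0, 103.0):
    print(size, guide_color(size, calibration_pattern_size=100.0))
```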
[0043] A user must register with the authentication system before use. For registration, the user finds the optimal distance from the device at which to hold his palm, so that the green light appears. As soon as the green light appears, the user who wants to register moves the hand in different directions, for example to the left, to the right, forwards and backwards. The palm is moved in the plane across the different directions in order to capture different parts of the palm, so that any part of the palm can later be used for authentication. A minimum authentication threshold can be set during this process. The threshold value indicates the minimum palm area that is required for an authentication to be accepted.

Fig. 6 illustrates the various positions into which the palm is moved in accordance with an embodiment of the invention. Once the various parts of the palm have been scanned, a single image of the palm is built up. The image includes pattern information and depth information resulting from the unevenness of the palm (explained in more detail below), which are captured using a microlens array. The image further includes palm vein information acquired under NIR light and palm print information acquired under visible light. For example, the palm of Fig. 6 shows only palm prints, whereas the palm in Fig. 7 shows palm prints and palm veins.

Fig. 7 shows the determination of an area of interest (ROI, region of interest) for authentication in accordance with an embodiment of the invention. The palm vein endpoints are determined for the entire palm, and the determined vein ends are followed to the extreme points in order to form the area of interest. In Fig. 7 (middle) the outermost vein ends, for example 201a, 201b, 201c, 201d, 201e, 201f and 201g, are marked. The palm has many more vein ends, but only the outermost vein ends are considered for determining the area of interest. Once these ends have been identified and registered with their coordinates, the outermost endpoints of the veins 201a, 201b, 201c, 201d, 201e, 201f and 201g are traced to form the area of interest 202. The area of interest for obtaining the biometric characteristic data can thus be determined by the boundary formed by the palm veins. The mask pattern can also help to define the area of interest by clearly distinguishing the hand from the rest of the background.

[0046] The size of the mask pattern on the palm varies depending on the deformation of the palm surface. The palm has no flat surface, so the size of the mask pattern differs slightly in different regions of the palm depending on their distance from the lighting unit. The size of the mask pattern in the captured image can therefore be used as depth information. For example, the mask pattern size in the center of the palm is slightly larger than in the area around the center of the palm, which means that the center is slightly further away from the lighting unit than the regions surrounding it. The unevenness of the palm thus results in a distortion of the mask pattern (see Fig. 11, right). This unevenness of the palm, and the information obtained from it, can hardly be reproduced with an artificial palm and can therefore be used as an indication of the liveness of the person to be authenticated.
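One way to picture this liveness indication is to measure how strongly the projected polygon areas vary across the palm: a flat, printed fake would reflect an almost undistorted, uniform pattern. The following sketch is a simplified illustration with a hypothetical threshold, not the procedure defined by the patent.

```python
# Illustrative liveness sketch: a real palm distorts the projected pattern, so the
# per-cell polygon areas vary; a flat fake yields a nearly uniform pattern.
# The 5% threshold below is a hypothetical value, not taken from the patent.
from statistics import mean, pstdev

def pattern_distortion(cell_areas) -> float:
    """Coefficient of variation of the projected polygon areas (0 = perfectly uniform)."""
    return pstdev(cell_areas) / mean(cell_areas)

def looks_alive(cell_areas, min_distortion: float = 0.05) -> bool:
    """Treat the sample as 'live' only if the pattern is visibly distorted by palm curvature."""
    return pattern_distortion(cell_areas) >= min_distortion

print(looks_alive([100, 112, 95, 108, 118, 99]))   # uneven palm surface -> True
print(looks_alive([100, 101, 100, 99, 100, 101]))  # flat surface (possible fake) -> False
```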
Fig. 8 illustrates a method in which the palm biometric characteristics are extracted in accordance with an embodiment of the invention. After the area of interest has been defined, the presence of the palm prints and palm veins, which are detected under visible light and NIR light respectively, is determined per pattern block. A pattern block can refer to a single polygon or to a predefined number of polygons of the pattern mask; one way of defining pattern blocks is described below. The presence of the palm veins and palm prints is determined as in pattern block 301, where a palm vein and a palm print overlap, i.e. cross each other. The system identifies such overlaps and stores the details, such as the overlap of the palm print with the palm vein, with its polar coordinates together with the size/area of the polygon or pattern block.

The size of the polygon or pattern block, the polar coordinates and/or the depth information acquired using a microlens array, and the features of the palm veins and the palm print are stored as a weighted mathematical formula called a neuron node. This neuron node indicates in which direction to jump to the next list of connected neurons during pattern comparison. After the palm image has been captured with all this neuron information, the entire list of neuron nodes is saved as a mathematical formula.

Fig. 9 illustrates various pattern blocks in the previously defined area of interest in accordance with an embodiment of the invention. For further analysis, the system considers only those pattern blocks in the area of interest that contain both vein and print information. If a feature point has only a palm print or only palm veins, the corresponding block is ignored.

Fig. 10 illustrates a neural network of multiple neuron nodes in accordance with an embodiment of the invention. Once the features have been extracted, they are represented as a network of neuron nodes, such as Neuron 1 (401), Neuron 2 (402), Neuron 3 (403), Neuron 4 (404) and Neuron 5 (405). Consider Neuron 1 (401), which has a vein print and palm print lines 410a, 410b, 410c. The palm print information of Neuron 1 (401) is connected to Neuron 2 (402) by palm print line 410a; Neuron 2 (402) is further connected to Neuron 3 (403) by palm print line 410c, to Neuron 4 (404) by palm print line 410a, and to Neuron 5 (405) by palm print line 410b. Neuron 3 (403) is connected only to Neuron 2 (402), and Neuron 4 (404) is likewise connected to Neuron 2 (402). Neuron 5 (405) is connected to Neuron 2 (402) and not to Neuron 1 (401), because the angle of the palm print line points towards Neuron 2 (402). This information is read as 1-2-4, which means that from Neuron 1 (401) one can jump via Neuron 2 (402) to Neuron 4 (404), and as 1-2-3, which means that from Neuron 1 (401) one can jump via Neuron 2 (402) to Neuron 3 (403). This is repeated for the entire list of feature neurons in the area of interest of a palm and is represented as a mathematical model. The same process is performed and stored for the palm vein lines.

[0051] The extracted features can be stored as weighted neuron nodes in a network. The extracted features also include the presence of folds on the palm in relation to the pattern of the mask. Each node is defined by the existence of a pattern cell, by how the palm folds and veins overlap in that cell, by the coordinates of the features, by the depth information acquired using a microlens array and by the size of this predetermined geometric pattern. The network is expanded by stepping along the vein or fold wherever a pattern cell has both a palm vein feature and a palm fold feature. This is repeated until all features have been extracted, and the result is represented as a mathematically weighted neural network.

To compare two palm images in the authentication process, the system executes a multi-stage neuromorphic algorithm in order to determine the degree of agreement with the mathematical model stored in the database. The system therefore uses multi-phase object recognition. This multi-phase object recognition system uses extensive arrangements of information-rich neurons to build a biologically plausible model of visual information processing.
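A minimal data-structure sketch may help to picture such a network of neuron nodes. It is only one interpretation of the description above: the field names, the adjacency representation and the textual path encoding (e.g. "1-2-4") are assumptions, not the patent's data model.

```python
# Sketch of a neuron-node network as described above; all names and the
# path encoding are illustrative assumptions, not the patent's data model.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class NeuronNode:
    node_id: int
    polar_coords: Tuple[float, float]        # (radius, angle) of the feature point
    cell_area: float                         # size/area of the pattern block (polygon)
    depth: float                             # depth information, e.g. from the microlens array
    has_vein: bool = True                    # a node exists only where vein and print overlap
    has_print: bool = True
    neighbours: List[int] = field(default_factory=list)  # nodes connected along print/vein lines

def jump_paths(nodes: Dict[int, NeuronNode], start: int, depth: int = 2) -> List[str]:
    """Enumerate short jump paths such as '1-2-4' by following the stored connections."""
    paths = []
    def walk(current: int, trail: List[int]):
        if len(trail) - 1 == depth:
            paths.append("-".join(map(str, trail)))
            return
        for nxt in nodes[current].neighbours:
            if nxt not in trail:              # do not revisit nodes within one path
                walk(nxt, trail + [nxt])
    walk(start, [start])
    return paths

# Usage with the example connectivity of Fig. 10 (neuron 1 connects to 2; 2 to 3, 4 and 5):
nodes = {
    1: NeuronNode(1, (10.0, 0.3), 1.0, 2.1, neighbours=[2]),
    2: NeuronNode(2, (12.5, 0.8), 1.0, 2.0, neighbours=[1, 3, 4, 5]),
    3: NeuronNode(3, (15.0, 1.1), 1.0, 1.9, neighbours=[2]),
    4: NeuronNode(4, (14.2, 0.5), 1.0, 2.2, neighbours=[2]),
    5: NeuronNode(5, (13.1, 1.4), 1.0, 2.0, neighbours=[2]),
}
print(jump_paths(nodes, start=1))  # -> ['1-2-3', '1-2-4', '1-2-5']
```

For this example connectivity, neuron 1 reaches neurons 3, 4 and 5 via neuron 2, which includes the jump readings 1-2-3 and 1-2-4 described above.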
Fig. 12 shows a possible way of defining pattern blocks 704 by using the pattern mask of the lighting unit. The captured image of the palm comprises palm veins 702, palm prints 703 and the mask pattern 701 projected onto the palm. The mask pattern 701 shown consists of the regular polygons a to s. In order to define a pattern block 704 for further analysis, the polygons that contain both palm veins 702 and palm prints 703 are determined. In this case, that polygon is g, which contains an intersection of a palm vein 702 and the palm print 703 (Fig. 12, left). The pattern block 704 is then defined by the polygon g and its surrounding polygons b, c, f, h, l and m (Fig. 12, right).

[0054] The foregoing description of the preferred embodiment of the present invention has been presented for purposes of illustration and description. It is not intended to be exhaustive and does not limit the invention to the specific form disclosed. Numerous changes and variations are possible in light of the above teachings. This detailed description is not intended to limit the scope of the present invention in any way.

[0055] It is understood that the present invention is not limited to the embodiments discussed above. Those skilled in the art, with knowledge of the invention, will be able to derive further variants, which also belong to the subject matter of the present invention.
Claims:
Claims (15)

[1] 1. A biometric authentication system for acquiring biometric characteristics from a body part of a person and comparing the acquired biometric characteristics with registered biometric characteristics already acquired from the body part in order to authenticate the person, comprising: a lighting unit (110) for illuminating the body part; an image capture unit (114) for capturing an image of the illuminated body part; a processing unit (118) for extracting the biometric characteristics from the image captured by the image capture unit (114); and an authentication unit (120) for comparing the extracted biometric data with registered identification data in order to authenticate the person; characterized in that the lighting unit (110) is provided with at least one pattern mask (112) in order to project a mask pattern (701) onto the body part, so that the projected mask pattern is contained in the captured image.

[2] 2. The biometric authentication system according to claim 1, wherein the pattern mask has a regular pattern of polygons, preferably triangles, quadrilaterals, squares, pentagons or hexagons.

[3] 3. The biometric authentication system according to one of the preceding claims, wherein the lighting unit comprises at least one light source of visible light and/or at least one light source of near-infrared (NIR) light, and wherein the light source preferably comprises a plurality of light-emitting diodes (LEDs).

[4] 4. The biometric authentication system according to claim 3, wherein the light sources or LEDs are arranged in a circle around the capture unit.

[5] 5. The biometric authentication system according to claim 3 or 4, wherein different light sources are arranged alternately.

[6] 6. The biometric authentication system according to one of the preceding claims, wherein the biometric authentication system further comprises an optical guide unit which is able to determine the optimal distance between the body part and the capture unit on the basis of the mask pattern projected onto the body part.

[7] 7. The biometric authentication system according to one of the preceding claims, wherein the capture unit comprises a microlens arrangement for obtaining depth information of the body part.

[8] 8. A biometric authentication method for acquiring biometric characteristic data from a body part of a person and for comparing the acquired biometric characteristic data with registered biometric characteristic data already acquired from the body part in order to authenticate the person, comprising the steps of: illuminating the body part using a lighting unit (110) which is provided with at least one pattern mask (112) in order to project a mask pattern onto the body part; capturing an image of the illuminated body part, comprising the projected mask pattern, using an image capture unit (114); extracting the biometric characteristics from the image captured by the capture unit (114) using a processing unit (118); and comparing the extracted biometric data with registered identification data in order to authenticate the person using an authentication unit (120).

[9] 9. The biometric authentication method according to claim 8, wherein the pattern mask has a regular pattern of polygons, preferably triangles, quadrilaterals, squares, pentagons or hexagons, and is used to determine the optimal distance between the body part and the capture unit.
[10] 10. The biometric authentication method according to claim 8 or 9, wherein the mask pattern is used to guide the body part in order to reach an optimal distance between the body part and the capture unit.

[11] 11. The biometric authentication method according to any one of claims 8 to 10, wherein the lighting unit illuminates the body part with visible light and NIR light in order to obtain combined images which show both the structures of the body part that are recognizable under visible light and under NIR light, respectively, and the mask pattern.

[12] 12. The biometric authentication method according to claim 11, wherein the mask pattern is used to extract the biometric characteristic data.

[13] 13. A biometric authentication device for capturing images for carrying out the method according to one of claims 8 to 12, the device comprising a lighting unit (110) for illuminating the body part and an image capture unit (114) for capturing an image of the illuminated body part, characterized in that the lighting unit (110) is provided with at least one pattern mask (112) in order to project a mask pattern (701) onto the body part, so that the captured image comprises the projected mask pattern.

[14] 14. The biometric authentication device according to claim 13, the device further comprising a processing unit (118) for extracting the biometric identification data from the image captured by the image capture unit (114) and an authentication unit (120) for comparing the extracted biometric data with registered identification data in order to authenticate the person.

[15] 15. The biometric authentication device according to claim 13, the device further comprising means for connecting the device to an external processing unit (118) for extracting the biometric characteristic data from the image captured by the image capture unit (114), and means for connecting the device to an authentication unit (120) in order to compare the extracted biometric data with registered characteristics to authenticate the person.
Cited references:
Publication number | Filing date | Publication date | Applicant | Title
US20060120576A1 | 2004-11-08 | 2006-06-08 | Biomagnetic Imaging Llc | 3D Fingerprint and palm print data model and capture devices using multi structured lights and cameras
US20080107309A1 | 2006-11-03 | 2008-05-08 | Cerni Consulting, Llc | Method and apparatus for biometric identification
US20080211628A1 | 2007-03-01 | 2008-09-04 | Sony Corporation | Biometric authentic device
CN201302723Y | 2008-10-29 | 2009-09-02 | 北京市新技术应用研究所 | Online multi-spectral palm image collecting instrument
WO2012041826A1 | 2010-09-28 | 2012-04-05 | Icognize Gmbh | Method and device for the non-contact detection of biometric features
EP2854097A1 | 2012-05-22 | 2015-04-01 | Fujitsu Limited | Bio-information processing device, bio-information processing method, and program
JP2002092616A | 2000-09-20 | 2002-03-29 | Hitachi Ltd | Individual authentication device
JP4387643B2 | 2002-07-31 | 2009-12-16 | Fujitsu Ltd | Processing device with personal recognition function
JP4546168B2 | 2004-06-28 | 2010-09-15 | Fujitsu Ltd | Biometric authentication system registration method, biometric authentication system and program thereof
DE602005016339D1 | 2004-06-28 | 2009-10-15 | Fujitsu Ltd | Biometric authentication system and registration procedure
CN101946262B | 2008-02-15 | 2016-04-20 | Fujitsu Ltd | The camera of biological identification and biological authentication apparatus
US9223955B2 | 2014-01-30 | 2015-12-29 | Microsoft Corporation | User-authentication gestures
US9355236B1 | 2014-04-03 | 2016-05-31 | Fuji Xerox Co., Ltd. | System and method for biometric user authentication using 3D in-air hand gestures
WO2020136883A1 | 2018-12-28 | 2020-07-02 | JCB Co., Ltd. | Authentication system
Sulfonates, polymers, resist compositions and patterning process
Washing machine
Washing machine
Device for fixture finishing and tension adjusting of membrane
Structure for Equipping Band in a Plane Cathode Ray Tube
Process for preparation of 7 alpha-carboxyl 9, 11-epoxy steroids and intermediates useful therein an
国家/地区
|